Search Results

Documents authored by Dereziński, Michał


Domain Sparsification of Discrete Distributions Using Entropic Independence

Authors: Nima Anari, Michał Dereziński, Thuy-Duong Vuong, and Elizabeth Yang

Published in: LIPIcs, Volume 215, 13th Innovations in Theoretical Computer Science Conference (ITCS 2022)


Abstract
We present a framework for speeding up the time it takes to sample from discrete distributions μ defined over subsets of size k of a ground set of n elements, in the regime where k is much smaller than n. We show that if one has access to estimates of the marginals P_{S∼μ}[i ∈ S], then the task of sampling from μ can be reduced to sampling from related distributions ν supported on size-k subsets of a ground set of only n^{1-α} · poly(k) elements. Here, 1/α ∈ [1, k] is the parameter of entropic independence for μ. Further, our algorithm only requires sparsified distributions ν that are obtained by applying a sparse (mostly 0) external field to μ, an operation that, for many distributions μ of interest, retains algorithmic tractability of sampling from ν. This phenomenon, which we dub domain sparsification, allows us to pay a one-time cost of estimating the marginals of μ and, in return, reduce the amortized cost needed to produce many samples from the distribution μ, as is often needed in upstream tasks such as counting and inference. For a wide range of distributions where α = Ω(1), our result reduces the domain size, and as a corollary the cost per sample, by a poly(n) factor. Examples include monomers in a monomer-dimer system, non-symmetric determinantal point processes, and partition-constrained Strongly Rayleigh measures. Our work significantly extends the reach of prior work of Anari and Dereziński, who obtained domain sparsification for distributions with a log-concave generating polynomial (corresponding to α = 1). As a corollary of our new analysis techniques, we also obtain a less stringent requirement on the accuracy of marginal estimates even for the case of log-concave polynomials; roughly speaking, we show that a constant-factor approximation is enough for domain sparsification, improving over the O(1/k) relative error established in prior work.
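
The abstract describes a three-step pipeline: estimate the marginals P_{S∼μ}[i ∈ S] once, use them to subsample a much smaller ground set, and then sample from μ under a sparse external field supported on that smaller set. The Python sketch below illustrates this shape on a toy product-weight distribution; the subsampling rule, the target domain size t, and the brute-force sampler are illustrative placeholders and not the procedure or guarantees analyzed in the paper.

import itertools
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for mu: a size-k distribution with mu(S) proportional to the
# product of per-element weights. (The paper targets much richer families,
# e.g. Strongly Rayleigh measures and nonsymmetric DPPs.)
n, k = 12, 3
weights = rng.uniform(0.1, 2.0, size=n)

def mu_unnormalized(S):
    return float(np.prod(weights[list(S)]))

subsets = list(itertools.combinations(range(n), k))
Z = sum(mu_unnormalized(S) for S in subsets)

# Step 1 (one-time cost): obtain the marginals P_{S ~ mu}[i in S].
# Computed exactly by brute force here; the paper only needs estimates,
# and shows constant-factor accuracy suffices.
marginals = np.zeros(n)
for S in subsets:
    p = mu_unnormalized(S) / Z
    for i in S:
        marginals[i] += p

# Step 2: build a reduced ground set T by drawing elements with probability
# proportional to their marginals. The draw count t is a placeholder; in the
# paper the reduced domain has roughly n^{1-alpha} * poly(k) elements.
t = 6
probs = marginals / marginals.sum()
T = np.unique(rng.choice(n, size=t, replace=True, p=probs))

# Step 3: sample from nu, i.e. mu under a sparse external field that zeroes
# out every element outside T (equivalently, mu conditioned on S lying in T).
restricted = [S for S in subsets if set(S) <= set(T)]
w = np.array([mu_unnormalized(S) for S in restricted])
w /= w.sum()
sample = restricted[rng.choice(len(restricted), p=w)]

print("reduced domain:", T.tolist(), "  sample:", sample)

In the paper, the restricted distribution ν remains algorithmically tractable for the listed families and the marginal estimation is a one-time cost amortized over many samples; the exhaustive enumeration above is only for the toy demonstration.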

Cite as

Nima Anari, Michał Dereziński, Thuy-Duong Vuong, and Elizabeth Yang. Domain Sparsification of Discrete Distributions Using Entropic Independence. In 13th Innovations in Theoretical Computer Science Conference (ITCS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 215, pp. 5:1-5:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{anari_et_al:LIPIcs.ITCS.2022.5,
  author =	{Anari, Nima and Derezi\'{n}ski, Micha{\l} and Vuong, Thuy-Duong and Yang, Elizabeth},
  title =	{{Domain Sparsification of Discrete Distributions Using Entropic Independence}},
  booktitle =	{13th Innovations in Theoretical Computer Science Conference (ITCS 2022)},
  pages =	{5:1--5:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-217-4},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{215},
  editor =	{Braverman, Mark},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2022.5},
  URN =		{urn:nbn:de:0030-drops-156013},
  doi =		{10.4230/LIPIcs.ITCS.2022.5},
  annote =	{Keywords: Domain Sparsification, Markov Chains, Sampling, Entropic Independence}
}